MHRC: Closed-loop Decentralized Multi-Heterogeneous Robot Collaboration with Large Language Models
Yu, Wenhao, Peng, Jie, Ying, Yueliang, Li, Sai, Ji, Jianmin, Zhang, Yanyong
The integration of large language models (LLMs) with robotics has significantly advanced robots' abilities in perception, cognition, and task planning. Natural language interfaces offer a unified way to express the capability differences of heterogeneous robots, facilitate communication between them, and enable seamless task allocation and collaboration. However, using LLMs to achieve decentralized collaboration among multiple heterogeneous robots remains an under-explored area of research. In this paper, we introduce a novel framework that uses LLMs to achieve decentralized collaboration among multiple heterogeneous robots. Our framework supports three robot categories: mobile robots, manipulation robots, and mobile manipulation robots, which work together to complete tasks such as exploration, transportation, and organization. We developed a rich set of textual feedback mechanisms and chain-of-thought (CoT) prompts to enhance task planning efficiency and overall system performance. The mobile manipulation robot can flexibly adjust its base position to ensure optimal conditions for grasping tasks. The manipulation robot can comprehend task requirements, seek assistance when necessary, and handle objects appropriately. Meanwhile, the mobile robot can explore the environment extensively, map object locations, and communicate this information to the mobile manipulation robot, thus improving task execution efficiency. We evaluated the framework in PyBullet, creating scenarios with three different room layouts and three distinct operational tasks. We tested various LLM models and conducted ablation studies to assess the contributions of different modules. The experimental results confirm the effectiveness and necessity of our proposed framework.
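The abstract above does not include code; purely as an illustration, the closed-loop, decentralized exchange it describes, where each robot plans from its own capabilities plus peers' textual feedback, might be sketched as follows. All names here (`Robot`, `stub_llm_plan`, `decentralized_round`) are hypothetical, and `stub_llm_plan` is a trivial stand-in for a real LLM planner call, not the paper's method:

```python
from dataclasses import dataclass, field

@dataclass
class Robot:
    name: str
    capabilities: set          # skills this robot can perform, e.g. {"navigate"}
    inbox: list = field(default_factory=list)  # textual feedback from peers

def stub_llm_plan(robot, task, messages):
    # Placeholder for a real LLM call: a robot with the required skill
    # claims the task; otherwise it broadcasts a natural-language help request.
    if task["skill"] in robot.capabilities:
        return f"{robot.name}: executing {task['goal']}"
    return f"{robot.name}: requesting help with {task['goal']}"

def decentralized_round(robots, task):
    # Each robot plans independently from its own view plus shared messages,
    # then broadcasts its textual decision to all peers (closed-loop feedback).
    decisions = []
    for r in robots:
        decision = stub_llm_plan(r, task, r.inbox)
        decisions.append(decision)
        for peer in robots:
            if peer is not r:
                peer.inbox.append(decision)
    return decisions

robots = [
    Robot("mobile", {"navigate"}),
    Robot("arm", {"grasp"}),
    Robot("mobile_arm", {"navigate", "grasp"}),
]
task = {"goal": "fetch the cup", "skill": "grasp"}
print(decentralized_round(robots, task))
```

In the actual framework each `stub_llm_plan` call would be a prompted LLM query with CoT reasoning, and the broadcast messages would carry richer feedback such as mapped object locations from the mobile robot.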
- North America > United States > North Carolina (0.04)
- Asia > Middle East > Republic of Türkiye > Karaman Province > Karaman (0.04)
- Asia > China > Anhui Province > Hefei (0.04)
- Information Technology > Artificial Intelligence > Robots (1.00)
- Information Technology > Artificial Intelligence > Natural Language > Large Language Model (1.00)
- Information Technology > Artificial Intelligence > Machine Learning > Learning Graphical Models > Undirected Networks > Markov Models (0.47)
How robotics can fit your operation
If you wanted to create the perfect mix of conditions to trigger the growth of a type of automated equipment for distribution center operations, you couldn't do much better than the factors lining up in favor of robotics. E-commerce and omni-channel fulfillment are driving labor-intensive piece picking, and warehouse labor is increasingly difficult to find and retain. DC operators know they need to automate to reduce these challenges, but few operations are able to plunk down millions for traditional automation projects that carry long payback times and might be difficult to reconfigure. Enter a new generation of robots, based on technology that uses on-board sensors and a more "natural" approach to navigation. Research shows high interest in robotics. According to MHI's "2018 Annual Industry Study", 65% of professionals surveyed now see robotics and automation as a major source of disruption and/or opportunity, up from 61% last year.
IoT and robotics, evolving together
Bringing IoT to a factory floor is as much about robots as it is about any other class of device. Meanwhile, robots are more attractive in their own right because labor costs are rising globally. Although robotics for automation generally focuses on movement and manipulation, IoT and robotics together involve a world of devices in the field -- devices that depend on electronic sensors and software. "A lot of companies have outsourced for low-cost labor, but labor in China is going up in cost by 15% a year," said Jim Lawton, COO at Rethink Robotics Inc., a company that focuses on designing robots that work well with humans. In fact, he noted, almost every manufacturing company he has spoken with in recent years complains about the lack of labor; temporary help firms are hired to bring in workers, and by lunchtime half of them are gone, he said.
- Asia > China (0.25)
- North America > United States > Texas (0.05)
Learning Probabilistic Models for Mobile Manipulation Robots
Sturm, Jürgen (Technical University of Munich) | Burgard, Wolfram (University of Freiburg)
Mobile manipulation robots are envisioned to provide many useful services in domestic environments as well as in industrial contexts. In this paper, we present novel approaches that allow mobile manipulation systems to autonomously adapt to new or changing situations. The approaches developed in this paper cover the following four topics: (1) learning the robot's kinematic structure and properties using actuation and visual feedback, (2) learning about articulated objects in the environment in which the robot is operating, (3) using tactile feedback to augment visual perception, and (4) learning novel manipulation tasks from human demonstrations.
- Research Report (0.73)
- Overview (0.73)